85 research outputs found

    On optimality of kernels for approximate Bayesian computation using sequential Monte Carlo

    Approximate Bayesian computation (ABC) has gained popularity over the past few years for the analysis of complex models arising in population genetics, epidemiology and systems biology. Sequential Monte Carlo (SMC) approaches have become workhorses in ABC. Here we discuss how to choose the perturbation kernels required in ABC SMC approaches in order to construct a sequence of distributions that starts from a suitably defined prior and converges towards the unknown posterior. We derive optimality criteria for different kernels, based on the Kullback-Leibler divergence between a distribution and the distribution of the perturbed particles. We show that for many complicated posterior distributions, locally adapted kernels tend to perform best. We find that the moderate added cost of adapting kernel functions is easily recouped through higher acceptance rates. We demonstrate the computational efficiency gains in a range of toy examples that illustrate some of the challenges faced in real-world applications of ABC, before turning to two demanding parameter inference problems in molecular biology, which highlight the large efficiency gains that the choice of optimal kernels can deliver. We conclude with a general discussion of the rational choice of perturbation kernels in ABC SMC settings.
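
    The kernel-adaptation idea above can be sketched in a toy ABC SMC run (my own illustrative code, not the authors'): infer the mean of a Gaussian from its sample-mean summary, perturbing resampled particles with a Gaussian kernel whose variance is twice the weighted variance of the previous population, a common global adaptation rule that the paper's locally adapted kernels refine per particle. The prior range, tolerance schedule and population size are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(3.0, 1.0, size=50)       # observed data with unknown mean
s_obs = data.mean()                        # summary statistic

def simulate(theta):
    """Simulate the summary statistic for a proposed mean theta."""
    return rng.normal(theta, 1.0, size=50).mean()

epsilons = [1.0, 0.5, 0.2, 0.1]            # decreasing tolerance schedule
n_part = 200

# Initial population: rejection sampling from the prior U(-10, 10).
particles = []
while len(particles) < n_part:
    theta = rng.uniform(-10, 10)
    if abs(simulate(theta) - s_obs) < epsilons[0]:
        particles.append(theta)
particles = np.array(particles)
weights = np.full(n_part, 1.0 / n_part)

for eps in epsilons[1:]:
    # Adapted Gaussian kernel: variance = 2 * weighted population variance.
    sigma = np.sqrt(2.0 * np.cov(particles, aweights=weights))
    new_parts, new_w = [], []
    while len(new_parts) < n_part:
        theta = rng.choice(particles, p=weights)    # resample
        theta = theta + rng.normal(0.0, sigma)      # perturb
        if not (-10 < theta < 10):                  # outside prior support
            continue
        if abs(simulate(theta) - s_obs) < eps:      # ABC acceptance
            # Importance weight: uniform prior / mixture of kernels.
            dens = np.sum(weights *
                          np.exp(-0.5 * ((theta - particles) / sigma) ** 2))
            new_parts.append(theta)
            new_w.append(1.0 / dens)
    particles = np.array(new_parts)
    weights = np.array(new_w) / np.sum(new_w)

print(np.sum(weights * particles))   # weighted posterior-mean estimate
```

    The adaptation step is the one the paper optimises: a kernel that is too narrow wastes iterations, one that is too wide proposes particles that are rarely accepted.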

    JRA3 Electromagnetic Calorimeter Technical Design Report

    This report describes the design of the prototype for a silicon-tungsten electromagnetic calorimeter with unprecedentedly high granularity, to be operated in a detector at the International Linear Collider (ILC). The R&D for the prototype is co-funded by the European Union in the FP6 framework within the so-called EUDET project in the years 2006-2010. The dimensions of the prototype are similar to those envisaged for the final detector. Already at this stage the prototype features a highly compact design: the sensitive layers, the Very Front End electronics serving 64 channels per ASIC, and copper plates for heat draining are integrated within 2000 µm.

    High flux polarized gamma rays production: first measurements with a four-mirror cavity at the ATF

    The next generation of e+/e- colliders will require a very intense flux of gamma rays to allow high-current polarized positron beams to be produced. This can be achieved by converting polarized high-energy photons into polarized e+/e- pairs in a target. In that context, an optical system consisting of a laser and a four-mirror passive Fabry-Perot cavity has recently been installed at the Accelerator Test Facility (ATF) at KEK to produce a high flux of polarized gamma rays by inverse Compton scattering. In this contribution, we describe the experimental system and present preliminary results. An ultra-stable four-mirror non-planar geometry has been implemented to ensure the polarization of the gamma rays produced. A fiber amplifier is used to inject about 10 W into the high-finesse cavity, with a gain of 1000. A digital feedback system keeps the cavity at the length required for optimal power enhancement. Preliminary measurements show that a flux of about 4×10^6 γ/s with an average energy of about 24 MeV was generated. Several upgrades currently in progress are also described.
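
    As a quick plausibility check on the quoted average energy, the inverse-Compton kinematics can be worked through with assumed numbers (ATF's nominal 1.28 GeV electron beam energy and a ~1030 nm Yb-fibre laser wavelength; neither value is stated above):

```python
# Back-of-envelope check: for a head-on inverse-Compton collision the
# maximum scattered photon energy is E_max = 4*gamma^2*E_laser / (1 + x),
# with recoil parameter x = 4*gamma*E_laser / (m_e c^2).
m_e = 0.511e6                 # electron rest energy [eV]
E_beam = 1.28e9               # ATF beam energy [eV] (assumed)
E_laser = 1240.0 / 1030.0     # photon energy [eV] from lambda = 1030 nm

gamma = E_beam / m_e
x = 4 * gamma * E_laser / m_e
E_max = 4 * gamma**2 * E_laser / (1 + x)

print(E_max / 1e6)   # roughly 30 (MeV): the Compton edge, so an
                     # average energy near 24 MeV is plausible
```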

    Applying machine learning to automated segmentation of head and neck tumour volumes and organs at risk on radiotherapy planning CT and MRI scans

    Radiotherapy is one of the main ways head and neck cancers are treated: radiation is used to kill cancerous cells and prevent their recurrence. Complex treatment planning is required to ensure that enough radiation reaches the tumour and as little as possible reaches sensitive structures (known as organs at risk), such as the eyes and nerves, which might otherwise be damaged. This is especially difficult in the head and neck, where multiple at-risk structures often lie in extremely close proximity to the tumour. It can take radiotherapy experts four hours or more to pick out the important areas on planning scans (a task known as segmentation). This research will focus on applying machine learning algorithms to automatic segmentation of head and neck planning computed tomography (CT) and magnetic resonance imaging (MRI) scans of University College London Hospitals NHS Foundation Trust patients. Through analysis of the images used in radiotherapy, DeepMind Health will investigate improvements in the efficiency of cancer treatment pathways.

    Some discussions of D. Fearnhead and D. Prangle's Read Paper "Constructing summary statistics for approximate Bayesian computation: semi-automatic approximate Bayesian computation"

    This report is a collection of comments on the Read Paper of Fearnhead and Prangle (2011), to appear in the Journal of the Royal Statistical Society Series B, along with a reply from the authors.

    Automated analysis of retinal imaging using machine learning techniques for computer vision

    There are almost two million people in the United Kingdom living with sight loss, including around 360,000 people who are registered as blind or partially sighted. Sight-threatening diseases, such as diabetic retinopathy and age-related macular degeneration, have contributed to the 40% increase in outpatient attendances in the last decade but are amenable to early detection and monitoring. With early and appropriate intervention, blindness may be prevented in many cases. Ophthalmic imaging provides a way to diagnose and objectively assess the progression of a number of pathologies, including neovascular (“wet”) age-related macular degeneration (wet AMD) and diabetic retinopathy. Two methods of imaging are commonly used: digital photographs of the fundus (the ‘back’ of the eye) and Optical Coherence Tomography (OCT, a modality that uses light waves in a way similar to how ultrasound uses sound waves). Changes in population demographics and expectations, and the changing pattern of chronic diseases, create rising demand for such imaging. Meanwhile, interrogation of such images is time-consuming, costly, and prone to human error. The application of novel analysis methods may provide a solution to these challenges. This research will focus on applying novel machine learning algorithms to the automatic analysis of both digital fundus photographs and OCT in Moorfields Eye Hospital NHS Foundation Trust patients. Through analysis of the images used in ophthalmology, along with relevant clinical and demographic information, Google DeepMind Health will investigate the feasibility of automated grading of digital fundus photographs and OCT, and will provide novel quantitative measures for specific disease features and for monitoring therapeutic success.

    Implicit particle methods and their connection with variational data assimilation

    The implicit particle filter is a sequential Monte Carlo method for data assimilation that guides the particles to high-probability regions via a sequence of steps that includes minimizations. We present a new and more general derivation of this approach and extend the method to particle smoothing as well as to data assimilation for perfect models. We show that the minimizations required by implicit particle methods are similar to those encountered in variational data assimilation, and we explore the connection of implicit particle methods with variational data assimilation. In particular, we argue that existing variational codes can be converted into implicit particle methods at low cost, often yielding better estimates that also come with quantitative measures of uncertainty. A detailed example is presented.
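
    The minimise-then-map step the abstract refers to has a closed form when the cost function is quadratic, which is also the regime where the link to variational methods is most transparent. Here is a minimal scalar linear-Gaussian sketch with toy numbers (my own illustration, not code from the paper):

```python
import numpy as np

# Implicit sampling for one assimilation step of a scalar linear-Gaussian
# model: minimise F, then map Gaussian reference samples xi through the
# level sets of F by solving F(x) - min F = xi^2 / 2.
rng = np.random.default_rng(1)
q, r = 0.5, 0.2          # model and observation error variances (toy)
x_prev, y = 1.0, 1.2     # previous state (taken as known) and observation

def F(x):
    """Negative log-posterior up to a constant (the 3D-Var cost function)."""
    return (x - x_prev) ** 2 / (2 * q) + (y - x) ** 2 / (2 * r)

# Minimisation step: analytic here; this is exactly the 3D-Var normal
# equation for a single scalar state.
x_map = (x_prev / q + y / r) / (1 / q + 1 / r)
phi = F(x_map)                 # minimum of F
hessian = 1 / q + 1 / r        # curvature at the minimum

# For quadratic F, F(x) - phi = xi^2/2 gives x = x_map + xi / sqrt(H),
# and the Jacobian of the map is constant, so all weights are equal.
xis = rng.normal(size=10000)
particles = x_map + xis / np.sqrt(hessian)

print(particles.mean(), particles.var())  # ~ posterior mean and variance
```

    For nonlinear models the minimisation needs a numerical optimiser, which is precisely why the abstract argues that existing variational codes can be reused.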

    Non-planar four-mirror optical cavity for high intensity gamma ray flux production by pulsed laser beam Compton scattering off GeV-electrons

    As part of the R&D toward the production of a high flux of polarised gamma rays, we have designed and built a non-planar four-mirror optical cavity with a high finesse and operated it at a particle accelerator. We report on the main challenges of such a cavity: the design of a suitable laser based on fiber technology, the mechanical difficulty of achieving high tunability and high mechanical stability in an accelerator environment, and the active stabilization of the cavity by implementing a double feedback loop in an FPGA.
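
    The double feedback loop itself is not specified above, so as a heavily hedged illustration, here is a generic digital PI lock acting on a simulated cavity-length error signal (all gains, noise levels and units are invented; a real FPGA lock would act on a Pound-Drever-Hall-type error signal at far higher rates):

```python
import numpy as np

# Generic proportional-integral lock on a noisy length-error readout.
rng = np.random.default_rng(2)
length_err = 50.0            # initial detuning from resonance [nm], arbitrary
kp, ki = 0.2, 0.05           # assumed PI gains
integral = 0.0

for step in range(500):
    measured = length_err + rng.normal(0.0, 0.5)   # noisy error signal
    integral += measured                            # integral term
    correction = kp * measured + ki * integral      # PI control law
    length_err -= correction                        # actuator moves mirror

print(abs(length_err))   # settles to within the measurement noise
```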

    Shower development of particles with momenta from 15 GeV to 150 GeV in the CALICE scintillator-tungsten hadronic calorimeter

    We present a study of showers initiated by electrons, pions, kaons, and protons with momenta from 15 GeV to 150 GeV in the highly granular CALICE scintillator-tungsten analogue hadronic calorimeter. The data were recorded at the CERN Super Proton Synchrotron in 2011. The analysis includes measurements of the calorimeter response to each particle type, as well as measurements of the energy resolution and studies of the longitudinal and radial shower development for selected particles. The results are compared to Geant4 simulations (version 9.6.p02). In the study of the energy resolution we include previously published data with beam momenta from 1 GeV to 10 GeV recorded at the CERN Proton Synchrotron in 2010.
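
    For the energy-resolution study mentioned above, hadronic calorimeter resolutions are conventionally parametrised as σ/E = a/√E added in quadrature with a constant term b; squaring both sides makes the fit linear in 1/E. A sketch with invented values of a and b (the paper's fitted numbers are not quoted here):

```python
import numpy as np

# Parametrisation: (sigma/E)^2 = a^2 / E + b^2, linear in 1/E.
E = np.array([15.0, 30.0, 60.0, 100.0, 150.0])   # beam momenta [GeV]
a_true, b_true = 0.55, 0.04                      # assumed stochastic/constant terms
rel_res = np.sqrt(a_true**2 / E + b_true**2)     # synthetic resolution points

# Linear least-squares fit of (sigma/E)^2 against 1/E.
slope, intercept = np.polyfit(1.0 / E, rel_res**2, 1)
a_fit, b_fit = np.sqrt(slope), np.sqrt(intercept)

print(a_fit, b_fit)   # recovers a_true and b_true from the synthetic points
```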